Console Output
Training and evaluating model for: Lamp
Dataset length: 28162 windows
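The dataset is sliced into fixed-length windows before training. The window size and stride that produced the 28162 windows above are not shown in the log, so this is a generic sketch of the windowing step; `make_windows` is a hypothetical helper, not code from the run.

```python
import numpy as np

def make_windows(series, window_size, stride=1):
    """Slice a multichannel time series of shape (T, C) into overlapping
    windows of shape (N, window_size, C). The actual window_size/stride
    used in the run above are not reported in the log."""
    T = len(series)
    starts = range(0, T - window_size + 1, stride)
    return np.stack([series[i:i + window_size] for i in starts])
```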
NILMModel(
(conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
(lstm): LSTM(9, 256, num_layers=5, batch_first=True, dropout=0.1)
(dropout): Dropout(p=0.1, inplace=False)
(relu): ReLU()
(output_layer): Linear(in_features=256, out_features=1, bias=True)
)
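The printed module list can be reconstructed as a PyTorch model. The layer shapes below come directly from the printout; the order of operations in `forward` (conv, ReLU, LSTM, dropout, then a linear head on the last time step) is an assumption, since the printout shows only the submodules, not the data flow.

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Sketch matching the printed architecture: Conv1d(9, 9, k=3, pad=1),
    5-layer LSTM(9 -> 256, dropout=0.1), Dropout(0.1), ReLU, Linear(256 -> 1).
    The forward() wiring is assumed, not taken from the log."""

    def __init__(self, in_channels=9, hidden_size=256, num_layers=5, dropout=0.1):
        super().__init__()
        self.conv1d = nn.Conv1d(in_channels, in_channels,
                                kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden_size, num_layers=num_layers,
                            batch_first=True, dropout=dropout)
        self.dropout = nn.Dropout(dropout)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, channels); Conv1d expects (batch, channels, seq_len)
        y = self.relu(self.conv1d(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.lstm(y)                    # (batch, seq_len, hidden)
        out = self.dropout(out[:, -1, :])        # last time step
        return self.output_layer(out)            # (batch, 1)
```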
Epoch [1/300], Train Loss: 0.004881
Validation Loss: 0.004891
Epoch [2/300], Train Loss: 0.004718
Validation Loss: 0.004898
Epoch [3/300], Train Loss: 0.004771
Validation Loss: 0.004881
Epoch [4/300], Train Loss: 0.004788
Validation Loss: 0.004875
Epoch [5/300], Train Loss: 0.004706
Validation Loss: 0.004868
Epoch [6/300], Train Loss: 0.004906
Validation Loss: 0.004861
Epoch [7/300], Train Loss: 0.004704
Validation Loss: 0.004861
Epoch [8/300], Train Loss: 0.004699
Validation Loss: 0.004867
Epoch [9/300], Train Loss: 0.005044
Validation Loss: 0.004943
Epoch [10/300], Train Loss: 0.004723
Validation Loss: 0.004864
Epoch [11/300], Train Loss: 0.004699
Validation Loss: 0.004860
Epoch [12/300], Train Loss: 0.004697
Validation Loss: 0.004859
Epoch [13/300], Train Loss: 0.004695
Validation Loss: 0.004861
Epoch [14/300], Train Loss: 0.004694
Validation Loss: 0.004849
Epoch [15/300], Train Loss: 0.004686
Validation Loss: 0.004836
Epoch [16/300], Train Loss: 0.004673
Validation Loss: 0.004806
Epoch [17/300], Train Loss: 0.004636
Validation Loss: 0.004703
Epoch [18/300], Train Loss: 0.004245
Validation Loss: 0.004463
Epoch [19/300], Train Loss: 0.004777
Validation Loss: 0.004903
Epoch [20/300], Train Loss: 0.004727
Validation Loss: 0.004886
Epoch [21/300], Train Loss: 0.004703
Validation Loss: 0.004897
Epoch [22/300], Train Loss: 0.004636
Validation Loss: 0.004655
Epoch [23/300], Train Loss: 0.004620
Validation Loss: 0.004887
Epoch [24/300], Train Loss: 0.004694
Validation Loss: 0.004794
Epoch [25/300], Train Loss: 0.004302
Validation Loss: 0.003737
Epoch [26/300], Train Loss: 0.002655
Validation Loss: 0.002003
Epoch [27/300], Train Loss: 0.001760
Validation Loss: 0.001505
Epoch [28/300], Train Loss: 0.001413
Validation Loss: 0.001252
Epoch [29/300], Train Loss: 0.001231
Validation Loss: 0.001173
Epoch [30/300], Train Loss: 0.001123
Validation Loss: 0.001081
Epoch [31/300], Train Loss: 0.001030
Validation Loss: 0.000892
Epoch [32/300], Train Loss: 0.000869
Validation Loss: 0.000792
Epoch [33/300], Train Loss: 0.000849
Validation Loss: 0.000818
Epoch [34/300], Train Loss: 0.000769
Validation Loss: 0.000704
Epoch [35/300], Train Loss: 0.000720
Validation Loss: 0.000689
Epoch [36/300], Train Loss: 0.000667
Validation Loss: 0.000616
Epoch [37/300], Train Loss: 0.000618
Validation Loss: 0.000555
Epoch [38/300], Train Loss: 0.000592
Validation Loss: 0.000548
Epoch [39/300], Train Loss: 0.000552
Validation Loss: 0.000501
Epoch [40/300], Train Loss: 0.000523
Validation Loss: 0.000480
Epoch [41/300], Train Loss: 0.000555
Validation Loss: 0.000883
Epoch [42/300], Train Loss: 0.000566
Validation Loss: 0.000510
Epoch [43/300], Train Loss: 0.000528
Validation Loss: 0.000844
Epoch [44/300], Train Loss: 0.000511
Validation Loss: 0.000471
Epoch [45/300], Train Loss: 0.000452
Validation Loss: 0.000445
Epoch [46/300], Train Loss: 0.000430
Validation Loss: 0.000415
Epoch [47/300], Train Loss: 0.000408
Validation Loss: 0.000405
Epoch [48/300], Train Loss: 0.000396
Validation Loss: 0.000418
Epoch [49/300], Train Loss: 0.000389
Validation Loss: 0.000367
Epoch [50/300], Train Loss: 0.000362
Validation Loss: 0.000433
Epoch [51/300], Train Loss: 0.000385
Validation Loss: 0.000347
Epoch [52/300], Train Loss: 0.000476
Validation Loss: 0.000360
Epoch [53/300], Train Loss: 0.000345
Validation Loss: 0.000329
Epoch [54/300], Train Loss: 0.000315
Validation Loss: 0.000322
Epoch [55/300], Train Loss: 0.000359
Validation Loss: 0.000340
Epoch [56/300], Train Loss: 0.000356
Validation Loss: 0.000306
Epoch [57/300], Train Loss: 0.000302
Validation Loss: 0.000293
Epoch [58/300], Train Loss: 0.000289
Validation Loss: 0.000277
Epoch [59/300], Train Loss: 0.000325
Validation Loss: 0.000300
Epoch [60/300], Train Loss: 0.000293
Validation Loss: 0.000274
Epoch [61/300], Train Loss: 0.000275
Validation Loss: 0.000269
Epoch [62/300], Train Loss: 0.000269
Validation Loss: 0.000257
Epoch [63/300], Train Loss: 0.000258
Validation Loss: 0.000250
Epoch [64/300], Train Loss: 0.000247
Validation Loss: 0.000240
Epoch [65/300], Train Loss: 0.000247
Validation Loss: 0.000233
Epoch [66/300], Train Loss: 0.000238
Validation Loss: 0.000239
Epoch [67/300], Train Loss: 0.000232
Validation Loss: 0.000224
Epoch [68/300], Train Loss: 0.000226
Validation Loss: 0.000225
Epoch [69/300], Train Loss: 0.000236
Validation Loss: 0.000238
Epoch [70/300], Train Loss: 0.000315
Validation Loss: 0.000257
Epoch [71/300], Train Loss: 0.000238
Validation Loss: 0.000221
Epoch [72/300], Train Loss: 0.000221
Validation Loss: 0.000219
Epoch [73/300], Train Loss: 0.000219
Validation Loss: 0.000227
Epoch [74/300], Train Loss: 0.000208
Validation Loss: 0.000210
Epoch [75/300], Train Loss: 0.000204
Validation Loss: 0.000205
Epoch [76/300], Train Loss: 0.000361
Validation Loss: 0.000318
Epoch [77/300], Train Loss: 0.000280
Validation Loss: 0.000399
Epoch [78/300], Train Loss: 0.000393
Validation Loss: 0.000236
Epoch [79/300], Train Loss: 0.000234
Validation Loss: 0.000230
Epoch [80/300], Train Loss: 0.000221
Validation Loss: 0.000208
Epoch [81/300], Train Loss: 0.000210
Validation Loss: 0.000213
Epoch [82/300], Train Loss: 0.000207
Validation Loss: 0.000199
Epoch [83/300], Train Loss: 0.000200
Validation Loss: 0.000202
Epoch [84/300], Train Loss: 0.000198
Validation Loss: 0.000193
Epoch [85/300], Train Loss: 0.000197
Validation Loss: 0.000193
Epoch [86/300], Train Loss: 0.000193
Validation Loss: 0.000186
Epoch [87/300], Train Loss: 0.000188
Validation Loss: 0.000189
Epoch [88/300], Train Loss: 0.000187
Validation Loss: 0.000189
Epoch [89/300], Train Loss: 0.000194
Validation Loss: 0.000185
Epoch [90/300], Train Loss: 0.000199
Validation Loss: 0.000186
Epoch [91/300], Train Loss: 0.000184
Validation Loss: 0.000176
Epoch [92/300], Train Loss: 0.000180
Validation Loss: 0.000179
Epoch [93/300], Train Loss: 0.000176
Validation Loss: 0.000174
Epoch [94/300], Train Loss: 0.000195
Validation Loss: 0.000176
Epoch [95/300], Train Loss: 0.000171
Validation Loss: 0.000173
Epoch [96/300], Train Loss: 0.000169
Validation Loss: 0.000169
Epoch [97/300], Train Loss: 0.000170
Validation Loss: 0.000172
Epoch [98/300], Train Loss: 0.000173
Validation Loss: 0.000383
Epoch [99/300], Train Loss: 0.000194
Validation Loss: 0.000171
Epoch [100/300], Train Loss: 0.000166
Validation Loss: 0.000165
Epoch [101/300], Train Loss: 0.000163
Validation Loss: 0.000188
Epoch [102/300], Train Loss: 0.000162
Validation Loss: 0.000160
Epoch [103/300], Train Loss: 0.000160
Validation Loss: 0.000160
Epoch [104/300], Train Loss: 0.000160
Validation Loss: 0.000158
Epoch [105/300], Train Loss: 0.000156
Validation Loss: 0.000160
Epoch [106/300], Train Loss: 0.000154
Validation Loss: 0.000161
Epoch [107/300], Train Loss: 0.000154
Validation Loss: 0.000154
Epoch [108/300], Train Loss: 0.000176
Validation Loss: 0.000166
Epoch [109/300], Train Loss: 0.000151
Validation Loss: 0.000154
Epoch [110/300], Train Loss: 0.000148
Validation Loss: 0.000153
Epoch [111/300], Train Loss: 0.000147
Validation Loss: 0.000147
Epoch [112/300], Train Loss: 0.000144
Validation Loss: 0.000156
Epoch [113/300], Train Loss: 0.000152
Validation Loss: 0.000174
Epoch [114/300], Train Loss: 0.000147
Validation Loss: 0.000150
Epoch [115/300], Train Loss: 0.000141
Validation Loss: 0.000147
Epoch [116/300], Train Loss: 0.000153
Validation Loss: 0.000179
Epoch [117/300], Train Loss: 0.000148
Validation Loss: 0.000147
Epoch [118/300], Train Loss: 0.000134
Validation Loss: 0.000162
Epoch [119/300], Train Loss: 0.000140
Validation Loss: 0.000144
Epoch [120/300], Train Loss: 0.000136
Validation Loss: 0.000144
Epoch [121/300], Train Loss: 0.000133
Validation Loss: 0.000139
Epoch [122/300], Train Loss: 0.000130
Validation Loss: 0.000149
Epoch [123/300], Train Loss: 0.000131
Validation Loss: 0.000144
Epoch [124/300], Train Loss: 0.000130
Validation Loss: 0.000143
Epoch [125/300], Train Loss: 0.000127
Validation Loss: 0.000138
Epoch [126/300], Train Loss: 0.000127
Validation Loss: 0.000137
Epoch [127/300], Train Loss: 0.000130
Validation Loss: 0.000153
Epoch [128/300], Train Loss: 0.000144
Validation Loss: 0.000147
Epoch [129/300], Train Loss: 0.000125
Validation Loss: 0.000137
Epoch [130/300], Train Loss: 0.000133
Validation Loss: 0.000153
Epoch [131/300], Train Loss: 0.000125
Validation Loss: 0.000134
Epoch [132/300], Train Loss: 0.000122
Validation Loss: 0.000134
Epoch [133/300], Train Loss: 0.000123
Validation Loss: 0.000137
Epoch [134/300], Train Loss: 0.000122
Validation Loss: 0.000133
Epoch [135/300], Train Loss: 0.000119
Validation Loss: 0.000133
Epoch [136/300], Train Loss: 0.000119
Validation Loss: 0.000133
Epoch [137/300], Train Loss: 0.000118
Validation Loss: 0.000131
Epoch [138/300], Train Loss: 0.000118
Validation Loss: 0.000135
Epoch [139/300], Train Loss: 0.000120
Validation Loss: 0.000132
Epoch [140/300], Train Loss: 0.000120
Validation Loss: 0.000132
Epoch [141/300], Train Loss: 0.000117
Validation Loss: 0.000128
Epoch [142/300], Train Loss: 0.000113
Validation Loss: 0.000128
Epoch [143/300], Train Loss: 0.000114
Validation Loss: 0.000135
Epoch [144/300], Train Loss: 0.000116
Validation Loss: 0.000132
Epoch [145/300], Train Loss: 0.000113
Validation Loss: 0.000132
Epoch [146/300], Train Loss: 0.000116
Validation Loss: 0.000132
Epoch [147/300], Train Loss: 0.000116
Validation Loss: 0.000129
Epoch [148/300], Train Loss: 0.000114
Validation Loss: 0.000128
Epoch [149/300], Train Loss: 0.000111
Validation Loss: 0.000129
Epoch [150/300], Train Loss: 0.000110
Validation Loss: 0.000128
Epoch [151/300], Train Loss: 0.000107
Validation Loss: 0.000125
Epoch [152/300], Train Loss: 0.000109
Validation Loss: 0.000126
Epoch [153/300], Train Loss: 0.000107
Validation Loss: 0.000125
Epoch [154/300], Train Loss: 0.000109
Validation Loss: 0.000134
Epoch [155/300], Train Loss: 0.000125
Validation Loss: 0.000151
Epoch [156/300], Train Loss: 0.000145
Validation Loss: 0.000129
Epoch [157/300], Train Loss: 0.000108
Validation Loss: 0.000128
Epoch [158/300], Train Loss: 0.000105
Validation Loss: 0.000124
Epoch [159/300], Train Loss: 0.000104
Validation Loss: 0.000123
Epoch [160/300], Train Loss: 0.000103
Validation Loss: 0.000129
Epoch [161/300], Train Loss: 0.000104
Validation Loss: 0.000135
Epoch [162/300], Train Loss: 0.000110
Validation Loss: 0.000122
Epoch [163/300], Train Loss: 0.000106
Validation Loss: 0.000125
Epoch [164/300], Train Loss: 0.000103
Validation Loss: 0.000122
Epoch [165/300], Train Loss: 0.000104
Validation Loss: 0.000121
Epoch [166/300], Train Loss: 0.000101
Validation Loss: 0.000122
Epoch [167/300], Train Loss: 0.000104
Validation Loss: 0.000122
Epoch [168/300], Train Loss: 0.000107
Validation Loss: 0.000128
Epoch [169/300], Train Loss: 0.000106
Validation Loss: 0.000122
Epoch [170/300], Train Loss: 0.000101
Validation Loss: 0.000118
Epoch [171/300], Train Loss: 0.000099
Validation Loss: 0.000118
Epoch [172/300], Train Loss: 0.000100
Validation Loss: 0.000128
Epoch [173/300], Train Loss: 0.000101
Validation Loss: 0.000120
Epoch [174/300], Train Loss: 0.000100
Validation Loss: 0.000121
Epoch [175/300], Train Loss: 0.000100
Validation Loss: 0.000121
Epoch [176/300], Train Loss: 0.000099
Validation Loss: 0.000117
Epoch [177/300], Train Loss: 0.000098
Validation Loss: 0.000118
Epoch [178/300], Train Loss: 0.000097
Validation Loss: 0.000128
Epoch [179/300], Train Loss: 0.000103
Validation Loss: 0.000122
Epoch [180/300], Train Loss: 0.000098
Validation Loss: 0.000116
Epoch [181/300], Train Loss: 0.000098
Validation Loss: 0.000115
Epoch [182/300], Train Loss: 0.000099
Validation Loss: 0.000118
Epoch [183/300], Train Loss: 0.000096
Validation Loss: 0.000115
Epoch [184/300], Train Loss: 0.000096
Validation Loss: 0.000120
Epoch [185/300], Train Loss: 0.000096
Validation Loss: 0.000116
Epoch [186/300], Train Loss: 0.000095
Validation Loss: 0.000117
Epoch [187/300], Train Loss: 0.000093
Validation Loss: 0.000119
Epoch [188/300], Train Loss: 0.000102
Validation Loss: 0.000151
Epoch [189/300], Train Loss: 0.000126
Validation Loss: 0.000115
Epoch [190/300], Train Loss: 0.000097
Validation Loss: 0.000118
Epoch [191/300], Train Loss: 0.000101
Validation Loss: 0.000116
Epoch [192/300], Train Loss: 0.000093
Validation Loss: 0.000114
Epoch [193/300], Train Loss: 0.000093
Validation Loss: 0.000115
Epoch [194/300], Train Loss: 0.000094
Validation Loss: 0.000119
Epoch [195/300], Train Loss: 0.000095
Validation Loss: 0.000114
Epoch [196/300], Train Loss: 0.000092
Validation Loss: 0.000114
Epoch [197/300], Train Loss: 0.000091
Validation Loss: 0.000112
Epoch [198/300], Train Loss: 0.000090
Validation Loss: 0.000113
Epoch [199/300], Train Loss: 0.000091
Validation Loss: 0.000117
Epoch [200/300], Train Loss: 0.000091
Validation Loss: 0.000112
Epoch [201/300], Train Loss: 0.000091
Validation Loss: 0.000117
Epoch [202/300], Train Loss: 0.000093
Validation Loss: 0.000125
Epoch [203/300], Train Loss: 0.000091
Validation Loss: 0.000112
Epoch [204/300], Train Loss: 0.000090
Validation Loss: 0.000114
Epoch [205/300], Train Loss: 0.000090
Validation Loss: 0.000113
Epoch [206/300], Train Loss: 0.000091
Validation Loss: 0.000113
Epoch [207/300], Train Loss: 0.000090
Validation Loss: 0.000117
Epoch [208/300], Train Loss: 0.000094
Validation Loss: 0.000117
Epoch [209/300], Train Loss: 0.000090
Validation Loss: 0.000114
Epoch [210/300], Train Loss: 0.000089
Validation Loss: 0.000109
Epoch [211/300], Train Loss: 0.000089
Validation Loss: 0.000113
Epoch [212/300], Train Loss: 0.000088
Validation Loss: 0.000111
Epoch [213/300], Train Loss: 0.000088
Validation Loss: 0.000112
Epoch [214/300], Train Loss: 0.000089
Validation Loss: 0.000112
Epoch [215/300], Train Loss: 0.000086
Validation Loss: 0.000109
Epoch [216/300], Train Loss: 0.000087
Validation Loss: 0.000118
Epoch [217/300], Train Loss: 0.000087
Validation Loss: 0.000110
Epoch [218/300], Train Loss: 0.000087
Validation Loss: 0.000117
Epoch [219/300], Train Loss: 0.000095
Validation Loss: 0.000108
Epoch [220/300], Train Loss: 0.000087
Validation Loss: 0.000108
Epoch [221/300], Train Loss: 0.000087
Validation Loss: 0.000111
Epoch [222/300], Train Loss: 0.000087
Validation Loss: 0.000109
Epoch [223/300], Train Loss: 0.000086
Validation Loss: 0.000103
Epoch [224/300], Train Loss: 0.000085
Validation Loss: 0.000110
Epoch [225/300], Train Loss: 0.000085
Validation Loss: 0.000110
Epoch [226/300], Train Loss: 0.000087
Validation Loss: 0.000121
Epoch [227/300], Train Loss: 0.000111
Validation Loss: 0.000116
Epoch [228/300], Train Loss: 0.000090
Validation Loss: 0.000112
Epoch [229/300], Train Loss: 0.000089
Validation Loss: 0.000123
Epoch [230/300], Train Loss: 0.000090
Validation Loss: 0.000112
Epoch [231/300], Train Loss: 0.000089
Validation Loss: 0.000118
Epoch [232/300], Train Loss: 0.000093
Validation Loss: 0.000113
Epoch [233/300], Train Loss: 0.000087
Validation Loss: 0.000110
Early stopping triggered
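Training halts at epoch 233 of 300 via early stopping on the validation loss. The patience used in the run is not shown in the log (the gap between the best validation loss at epoch 223 and the stop at 233 suggests a patience around 10, but that is inferred, not stated). A minimal version of such a monitor:

```python
class EarlyStopping:
    """Stop training after `patience` epochs without validation improvement.
    The patience value in the run above is not logged; 10 is an assumption."""

    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Call once per epoch; returns True when training should stop."""
        if val_loss < self.best:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience
```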
Evaluating model for: Lamp
Validation MAE: 0.164326 W
Validation MSE: 2.556795 W²
Validation RMSE: 1.598998 W
Signal Aggregate Error (SAE): 0.014234
Normalized Disaggregation Error (NDE): 0.122818
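The summary metrics above can be reproduced with the definitions commonly used in NILM evaluation: SAE compares total predicted and true energy, SAE = |Σŷ − Σy| / Σy, and NDE normalizes the squared error by the true signal's energy, NDE = Σ(ŷ − y)² / Σy². These are the standard formulas and are assumed (not confirmed by the log) to match the ones used in this run.

```python
import numpy as np

def nilm_metrics(y_true, y_pred):
    """Standard NILM evaluation metrics (assumed definitions of SAE/NDE).
    y_true, y_pred: 1-D arrays of appliance power in watts."""
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    sae = abs(y_pred.sum() - y_true.sum()) / y_true.sum()
    nde = np.sum(err ** 2) / np.sum(y_true ** 2)
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "SAE": sae, "NDE": nde}
```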
[Figure: training and validation loss curves over epochs]